Convergence of Polynomial Restart Krylov Methods for Eigenvalue Computations
Authors
Abstract
Krylov subspace methods have proved effective for many non-Hermitian eigenvalue problems, yet the analysis of such algorithms is involved. Convergence can be characterized by the angle the approximating subspace forms with a desired invariant subspace, resulting in a geometric framework that is robust to eigenvalue ill-conditioning. This paper describes a new bound on this angle that handles the complexities introduced by non-Hermitian matrices, yet has a simpler derivation than similar previous bounds. The new bound suggests that ill-conditioning of the desired eigenvalues exerts little influence on convergence, while instability of unwanted eigenvalues plays an essential role. Practical considerations restrict the dimension of the approximating Krylov space; to obtain convergence, one refines the vector that generates the subspace by applying a polynomial filter. Such filters dynamically steer a low-dimensional Krylov space toward a desired invariant subspace. We address the design of these filters, and illustrate with examples the subtleties that arise when restarting non-Hermitian iterations.
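To make the restart mechanism concrete, the following numpy sketch implements one common variant: an explicitly restarted Arnoldi iteration with exact shifts, in which the unwanted Ritz values of each m-step cycle become the roots of the filter polynomial that is then applied to the starting vector. The helper names (arnoldi, restarted_arnoldi), the choice of the k largest-magnitude Ritz values as "wanted", and the random test matrix are illustrative assumptions, not details taken from the paper; breakdown handling and real-arithmetic pairing of complex shifts are omitted.

import numpy as np

def arnoldi(A, v, m):
    # m-step Arnoldi factorization A @ V[:, :m] = V @ H, with V of size n x (m+1)
    # orthonormal and H of size (m+1) x m upper Hessenberg (breakdown not handled).
    n = A.shape[0]
    V = np.zeros((n, m + 1), dtype=complex)
    H = np.zeros((m + 1, m), dtype=complex)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                     # modified Gram-Schmidt
            H[i, j] = np.vdot(V[:, i], w)
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def restarted_arnoldi(A, v0, m, k, cycles=30):
    # Explicit polynomial restart with exact shifts: the m - k unwanted Ritz values
    # of each cycle are the roots of the filter p(z) = prod_j (z - mu_j), and the
    # next cycle is started from p(A) v.
    v = v0.astype(complex)
    for _ in range(cycles):
        V, H = arnoldi(A, v, m)
        ritz = np.linalg.eigvals(H[:m, :m])
        order = np.argsort(-np.abs(ritz))          # "wanted" = k largest in magnitude
        for mu in ritz[order[k:]]:                 # unwanted Ritz values = exact shifts
            v = A @ v - mu * v                     # apply one linear factor of p(A)
            v = v / np.linalg.norm(v)
    return ritz[order[:k]]

# Illustrative use on a random nonsymmetric matrix (synthetic data).
rng = np.random.default_rng(0)
A = rng.standard_normal((300, 300)) / np.sqrt(300)
print(restarted_arnoldi(A, rng.standard_normal(300), m=20, k=4))

In practice one would use an implicit restarting scheme (as in ARPACK) rather than applying the filter factor by factor, but the explicit form above mirrors the polynomial-filter viewpoint described in the abstract.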
Similar resources
Some new restart vectors for explicitly restarted Arnoldi method
The explicitly restarted Arnoldi method (ERAM) can be used to find some eigenvalues of large, sparse matrices. However, it has been shown that even this method may fail to converge. In this paper, we present two new methods to accelerate the convergence of the ERAM algorithm. In these methods, we apply two strategies for updating the initial vector in each restart cycle. The implementation of th...
Approximation of Largest Eigenpairs of Matrices and Applications to Pagerank Computation
In this work, we propose different approaches for the treatment of the following problems: (i) computation of the largest eigenvalue of a matrix and the corresponding eigenvector when neither is known, (ii) computation of the eigenvector of a matrix corresponding to its largest eigenvalue when this eigenvalue is known. The matrix is arbitrary, large, and sparse. We treat the first problem by Kryl...
NON-POLYNOMIAL SPLINE FOR THE NUMERICAL SOLUTION OF PROBLEMS IN CALCULUS OF VARIATIONS
A class of new methods based on a septic non-polynomial spline function for the numerical solution of problems in calculus of variations is presented. The local truncation errors and methods of orders 2, 4, 6, 8, 10, and 12 are obtained. The inverses of some band matrices, which are required in proving the convergence analysis of the presented method, are obtained. Convergence anal...
A Rational Krylov Method Based on Hermite Interpolation for Nonlinear Eigenvalue Problems
This paper proposes a new rational Krylov method for solving the nonlinear eigenvalue problem (NLEP): A(λ)x = 0. The method approximates A(λ) by Hermite interpolation where the degree of the interpolating polynomial and the interpolation points are not fixed in advance. It uses a companion-type reformulation to obtain a linear generalized eigenvalue problem (GEP). To this GEP we apply a rationa...
Projection Methods for Nonlinear Sparse Eigenvalue Problems
This paper surveys numerical methods for general sparse nonlinear eigenvalue problems, with special emphasis on iterative projection methods such as Jacobi–Davidson, Arnoldi, or rational Krylov methods and the automated multi-level substructuring. We do not review the rich literature on polynomial eigenproblems, which take advantage of a linearization of the problem.
Journal: SIAM Review
Volume: 47, Issue: -
Pages: -
Publication date: 2005